Probabilistic Independence Networks for Hidden Markov Probability Models
Authors
Abstract
Graphical techniques for modeling the dependencies of random variables have been explored in a variety of different areas, including statistics, statistical physics, artificial intelligence, speech recognition, image processing, and genetics. Formalisms for manipulating these models have been developed relatively independently in these research communities. In this paper we explore hidden Markov models (HMMs) and related structures within the general framework of probabilistic independence networks (PINs). The paper presents a self-contained review of the basic principles of PINs. It is shown that the well-known forward-backward (F-B) and Viterbi algorithms for HMMs are special cases of more general inference algorithms for arbitrary PINs. Furthermore, the existence of inference and estimation algorithms for more general graphical models provides a set of analysis tools for HMM practitioners who wish to explore a richer class of HMM structures. Examples of relatively complex models to handle sensor fusion and coarticulation in speech recognition are introduced and treated within the graphical model framework to illustrate the advantages of the general approach.
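As a concrete reminder of the forward-backward recursions the abstract refers to, here is a minimal NumPy sketch for a discrete-observation HMM. The variable names (`pi`, `A`, `B`, `obs`) and the toy parameters are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def forward_backward(pi, A, B, obs):
    """pi: (K,) initial state probabilities; A: (K, K) transition matrix;
    B: (K, M) emission matrix; obs: sequence of observation indices."""
    T, K = len(obs), len(pi)
    alpha = np.zeros((T, K))   # forward messages:  alpha_t(i) = P(o_1..o_t, s_t = i)
    beta = np.zeros((T, K))    # backward messages: beta_t(i)  = P(o_{t+1}..o_T | s_t = i)

    # Forward pass
    alpha[0] = pi * B[:, obs[0]]
    for t in range(1, T):
        alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]

    # Backward pass
    beta[-1] = 1.0
    for t in range(T - 2, -1, -1):
        beta[t] = A @ (B[:, obs[t + 1]] * beta[t + 1])

    likelihood = alpha[-1].sum()            # P(o_1..o_T)
    gamma = alpha * beta / likelihood       # posterior state marginals P(s_t = i | o_1..o_T)
    return gamma, likelihood

# Toy example: 2 hidden states, 3 observation symbols (all values made up).
pi = np.array([0.6, 0.4])
A = np.array([[0.7, 0.3], [0.4, 0.6]])
B = np.array([[0.5, 0.4, 0.1], [0.1, 0.3, 0.6]])
gamma, lik = forward_backward(pi, A, B, obs=[0, 1, 2, 1])
```

In the paper's terms, these posterior state marginals are, roughly, the single-node marginals that a general PIN inference algorithm computes when specialized to the HMM chain structure.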
Related Papers
Probabilistic Inference in Credal Networks: New Complexity Results
Credal networks are graph-based statistical models whose parameters take values in a set, instead of being sharply specified as in traditional statistical models (e.g., Bayesian networks). The computational complexity of inferences on such models depends on the irrelevance/independence concept adopted. In this paper, we study inferential complexity under the concepts of epistemic irrelevance an...
Image Segmentation using Gaussian Mixture Model
Abstract: Stochastic models such as mixture models, graphical models, Markov random fields, and hidden Markov models play a key role in probabilistic data analysis. In this paper, we fit a Gaussian mixture model to the pixels of an image. The parameters of the model were estimated by the EM algorithm. In addition, the pixel labeling corresponding to each pixel of the true image was obtained by Bayes' rule. In fact,...
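A minimal sketch of the kind of pipeline this abstract describes, assuming grayscale intensities as the per-pixel features; scikit-learn's GaussianMixture (which is fitted by EM) stands in for the estimation step, and the image, number of components, and variable names are hypothetical.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Hypothetical grayscale image; 3 segments is an arbitrary illustrative choice.
image = np.random.rand(64, 64)

pixels = image.reshape(-1, 1)                      # one feature (intensity) per pixel
gmm = GaussianMixture(n_components=3).fit(pixels)  # parameters estimated by EM
labels = gmm.predict(pixels)                       # MAP (Bayes-rule) component label per pixel
segmentation = labels.reshape(image.shape)
```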
Belief networks, hidden Markov models, and Markov random fields: A unifying view
The use of graphs to represent independence structure in multivariate probability models has been pursued in a relatively independent fashion across a wide variety of research disciplines since the beginning of this century. This paper provides a brief overview of the current status of such research with particular attention to recent developments which have served to unify such seemingly dispa...
From Probabilistic Horn Logic to Chain Logic
Probabilistic logics have attracted a great deal of attention during the past few years. Where logical languages have, already from the inception of the field of artificial intelligence, taken a central position in research on knowledge representation and automated reasoning, probabilistic graphical models with their associated probabilistic basis have taken up in recent years a similar positio...
Training Generalized Hidden Markov Model With Interval Probability Parameters
Recently generalized interval probability was proposed as a new mathematical formalism of imprecise probability. It provides a simplified probabilistic calculus based on its definitions of conditional probability and independence. The Markov property can be described in a form similar to classical probability. In this paper, an expectation-maximization approach is developed to train generalized...
Journal: Neural Computation
Volume 9, Issue 2
Pages: -
Publication year: 1997